154 research outputs found

    Hippocrates revisited? Old ideals and new realities

    Individual genomics has arrived, and personal decisions to make use of it are a new reality. What are the implications for the patient–physician relationship? In this article we address three factors that call the traditional concept of confidentiality into question. First, the illusion of absolute data safety, as demonstrated by medical informatics. Second, data sharing as a standard practice in genomics research, with comprehensive data sets widely accessible. Third, genotyping as a service directly available to consumers. The availability and accessibility of personal health data strongly suggest that the roles in the clinical encounter need to be remodeled. The old ideal of the physician as keeper of confidential information is outstripped by the reality of individuals who decide for themselves how their data are used.

    Routes for breaching and protecting genetic privacy

    We are entering the era of ubiquitous genetic information for research, clinical care, and personal curiosity. Sharing these datasets is vital for rapid progress in understanding the genetic basis of human diseases. However, one growing concern is the ability to protect the genetic privacy of the data originators. Here, we technically map threats to genetic privacy and discuss potential mitigation strategies for privacy-preserving dissemination of genetic data.

    The ethics of uncertainty for data subjects

    Modern health data practices come with many practical uncertainties. In this paper, I argue that data subjects' trust in the institutions and organizations that control their data, and their ability to know their own moral obligations in relation to their data, are undermined by significant uncertainties regarding the what, how, and who of mass data collection and analysis. I conclude by considering how proposals for managing situations of high uncertainty might be applied to this problem. These emphasize increasing organizational flexibility, knowledge, and capacity, and reducing hazard.

    A Systematic Review of Re-Identification Attacks on Health Data

    Privacy legislation in most jurisdictions allows the disclosure of health data for secondary purposes without patient consent if it is de-identified. Some recent articles in the medical, legal, and computer science literature have argued that de-identification methods do not provide sufficient protection because they are easy to reverse. Should this be the case, it would have significant implications for how health information is disclosed, including: (a) potentially limiting its availability for secondary purposes such as research, and (b) resulting in more identifiable health information being disclosed. Our objectives in this systematic review were to: (a) characterize known re-identification attacks on health data and contrast them with re-identification attacks on other kinds of data, (b) compute the overall proportion of records that have been correctly re-identified in these attacks, and (c) assess whether these attacks demonstrate weaknesses in current de-identification methods. Searches were conducted in IEEE Xplore, ACM Digital Library, and PubMed. After screening, fourteen eligible articles representing distinct attacks were identified. On average, approximately a quarter of the records were re-identified across all studies (0.26 with 95% CI 0.046–0.478), and 0.34 for attacks on health data (95% CI 0–0.744). There was considerable uncertainty around the proportions, as evidenced by the wide confidence intervals, and the mean proportion of records re-identified was sensitive to unpublished studies. Two of the fourteen attacks were performed with data that was de-identified using existing standards. Only one of these attacks was on health data, and it had a success rate of 0.00013. The current evidence shows a high re-identification rate but is dominated by small-scale studies on data that was not de-identified according to existing standards. This evidence is insufficient to draw conclusions about the efficacy of de-identification methods.
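
    As a rough illustration of the pooling step described above, the sketch below computes a mean re-identification proportion with a normal-approximation 95% confidence interval across per-study proportions. The proportions listed are hypothetical, and the review's actual meta-analytic method may differ.

```python
# Illustrative only: pooling per-study proportions of re-identified
# records with a simple mean and a normal-approximation 95% CI.
import math

# Hypothetical per-study proportions (not the review's actual data).
study_proportions = [0.02, 0.05, 0.10, 0.26, 0.35, 0.60, 0.00013]

n = len(study_proportions)
mean = sum(study_proportions) / n
sd = math.sqrt(sum((p - mean) ** 2 for p in study_proportions) / (n - 1))
half_width = 1.96 * sd / math.sqrt(n)

lower = max(0.0, mean - half_width)   # proportions cannot be negative
upper = min(1.0, mean + half_width)
print(f"pooled proportion: {mean:.3f} (95% CI {lower:.3f}-{upper:.3f})")
```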

    Advancing Tests of Relativistic Gravity via Laser Ranging to Phobos

    Phobos Laser Ranging (PLR) is a concept for a space mission designed to advance tests of relativistic gravity in the solar system. PLR's primary objective is to measure the curvature of space around the Sun, represented by the Eddington parameter $\gamma$, with an accuracy of two parts in $10^7$, thereby improving today's best result by two orders of magnitude. Other mission goals include measurements of the time rate of change of the gravitational constant $G$ and of the gravitational inverse-square law at 1.5 AU distances, with up to two orders of magnitude improvement for each. The science parameters will be estimated using laser ranging measurements of the distance between an Earth station and an active laser transponder on Phobos capable of reaching mm-level range resolution. A transponder on Phobos sending 0.25 mJ, 10 ps pulses at 1 kHz, and receiving asynchronous 1 kHz pulses from Earth via a 12 cm aperture, will permit links that even at maximum range will exceed one photon per second. A total measurement precision of 50 ps demands a few hundred photons to average down to 1 mm (3.3 ps) range precision. Existing satellite laser ranging (SLR) facilities, with appropriate augmentation, may be able to participate in PLR. Since Phobos' orbital period is about 8 hours, each observatory is guaranteed visibility of the Phobos instrument every Earth day. Given the current technology readiness level, PLR could be started in 2011 for launch in 2016, with 3 years of science operations. We discuss PLR's science objectives, instrument, and mission design. We also present the details of science simulations performed to support the mission's primary objectives.
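
    The photon budget quoted above follows from simple error averaging: the timing error of N independent photons shrinks as 1/sqrt(N). A minimal sketch of that arithmetic, using only numbers stated in the abstract:

```python
# Averaging N independent timing measurements improves precision by
# sqrt(N), so N = (sigma_single / sigma_target)^2.
C_MM_PER_PS = 0.2998          # speed of light, mm per picosecond

single_shot_ps = 50.0         # total single-measurement precision (stated)
target_mm = 1.0               # desired range precision (stated)
target_ps = target_mm / C_MM_PER_PS   # ~3.3 ps, matching the abstract

n_photons = (single_shot_ps / target_ps) ** 2
print(f"timing goal: {target_ps:.2f} ps -> ~{n_photons:.0f} photons")
# Prints roughly 225 photons, i.e. "a few hundred" as stated.
```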

    Using linked routinely collected health data to describe prostate cancer treatment in New South Wales, Australia: a validation study

    Background: Population-based patterns-of-care studies are important for monitoring cancer care, but conducting them is expensive and resource-intensive. Linkage of routinely collected administrative health data may provide an efficient alternative. Our aim was to determine the accuracy of linked routinely collected administrative data for monitoring prostate cancer care in New South Wales (NSW), Australia. Methods: The NSW Prostate Cancer Care and Outcomes Study (PCOS), a population-based survey of patterns of care for men aged less than 70 years diagnosed with prostate cancer in NSW, was linked to the NSW Cancer Registry, electronic hospital discharge records, and Medicare and pharmaceutical claims data from Medicare Australia. The main outcome measures were treatment with radical prostatectomy, any radiotherapy, external beam radiotherapy, brachytherapy or androgen deprivation therapy, and cancer staging. PCOS data were considered to represent the true treatment status. The sensitivity and specificity of the administrative data were estimated and relevant patient characteristics were compared using chi-squared tests. Results: The validation data set comprised 1857 PCOS patients with treatment information linked to Cancer Registry records. Hospital and Medicare claims data combined described treatment more accurately than either one alone. The combined data accurately recorded radical prostatectomy (96% sensitivity) and brachytherapy (93% sensitivity), but not androgen deprivation therapy (76% sensitivity). External beam radiotherapy was rarely captured (5% sensitivity), but this was improved by including Medicare claims for radiation field setting or dosimetry (86% sensitivity). False positive rates were near 0%. Disease stage comparisons were limited by one-third of cases having unknown stage in the Cancer Registry. Administrative data recorded treatment more accurately for cases in urban areas. Conclusions: Cancer Registry and hospital inpatient data accurately captured radical prostatectomy and brachytherapy treatment, but not external beam radiotherapy or disease stage. Medicare claims data substantially improved the accuracy with which all major treatments were recorded. These administrative data combined are valid for population-based studies of some aspects of prostate cancer care.
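
    A minimal sketch of the validation logic, assuming per-patient treatment flags and treating the PCOS survey as ground truth; the function and data layout are illustrative, not the study's actual code:

```python
# Sensitivity and specificity of an administrative treatment flag,
# scored against survey data taken as the true treatment status.

def sensitivity_specificity(pairs):
    """pairs: iterable of (truth, admin) booleans, one per patient."""
    tp = sum(1 for truth, admin in pairs if truth and admin)
    fn = sum(1 for truth, admin in pairs if truth and not admin)
    tn = sum(1 for truth, admin in pairs if not truth and not admin)
    fp = sum(1 for truth, admin in pairs if not truth and admin)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical linked records for one treatment, e.g. brachytherapy.
linked = [(True, True), (True, True), (True, False),
          (False, False), (False, False), (False, False)]
sens, spec = sensitivity_specificity(linked)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```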

    Overeating, caloric restriction and breast cancer risk by pathologic subtype: the EPIGEICAM study

    This study analyzes the association of excessive energy intake and caloric restriction with breast cancer (BC) risk, taking into account the individual energy needs of Spanish women. We conducted a multicenter matched case-control study in which 973 pairs completed lifestyle and food frequency questionnaires. Expected caloric intake was predicted from a linear regression model fitted in controls, with calories consumed as the dependent variable, basal metabolic rate as an offset, and physical activity as the explanatory variable. Overeating and caloric restriction were defined with reference to the 99% confidence interval of the predicted value. The association with BC risk, overall and by pathologic subtype, was evaluated using conditional and multinomial logistic regression models. While premenopausal women who consumed few calories (>20% below predicted) had lower BC risk (OR = 0.36; 95% CI = 0.21–0.63), postmenopausal women with an excessive intake (≥40% above predicted) showed an increased risk (OR = 2.81; 95% CI = 1.65–4.79). For every 20% increase in relative (observed/predicted) caloric intake, the risk of hormone receptor positive (p-trend < 0.001) and HER2+ (p-trend = 0.015) tumours increased by 13%; the corresponding figure for triple negative tumours was 7%. While high energy intake increases BC risk, caloric restriction could be protective. Moderate caloric restriction, in combination with regular physical activity, could be a good strategy for BC prevention.
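
    The expected-intake model lends itself to a short sketch: a Gaussian regression of calories on physical activity with basal metabolic rate as an offset, after which intakes are classified relative to the prediction. The data, the thresholds (taken from the abstract's categories), and the statsmodels formulation are illustrative assumptions, not the study's code:

```python
# Offset regression: calories ~ activity, with BMR entering as an offset,
# fitted on simulated "controls"; intakes are then classified relative
# to the predicted value.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
bmr = rng.normal(1400.0, 150.0, 500)       # kcal/day (hypothetical)
activity = rng.normal(40.0, 10.0, 500)     # arbitrary activity score
calories = bmr + 12.0 * activity + rng.normal(0.0, 120.0, 500)

X = sm.add_constant(activity)
fit = sm.GLM(calories, X, family=sm.families.Gaussian(), offset=bmr).fit()
predicted = fit.predict(X, offset=bmr)

relative = calories / predicted            # observed / predicted intake
restriction = relative < 0.80              # >20% below predicted
overeating = relative >= 1.40              # >=40% above predicted
print(f"restriction: {restriction.mean():.1%}, "
      f"overeating: {overeating.mean():.1%}")
```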

    Cold gas accretion in galaxies

    Evidence for the accretion of cold gas in galaxies has been accumulating rapidly in recent years. HI observations of galaxies and their environment have brought to light new facts and phenomena that are evidence of ongoing or recent accretion: 1) A large number of galaxies are accompanied by gas-rich dwarfs or are surrounded by HI cloud complexes, tails, and filaments. This may be regarded as direct evidence of cold gas accretion in the local universe. It is probably the same kind of infall of material as the stellar streams observed in the halos of our Galaxy and M31. 2) Considerable amounts of extra-planar HI have been found in nearby spiral galaxies. While a large fraction of this gas is produced by galactic fountains, it is likely that a part of it is of extragalactic origin. 3) Spirals are known to have extended and warped outer layers of HI. It is not clear how these have formed, and how and for how long the warps can be sustained. Gas infall has been proposed as the origin. 4) The majority of galactic disks are lopsided in their morphology as well as in their kinematics. Here too, recent accretion has been advocated as a possible cause. In our view, accretion takes place both through the arrival and merging of gas-rich satellites and through gas infall from the intergalactic medium (IGM). The infall may have observable effects on the disk, such as bursts of star formation and lopsidedness. We infer a mean "visible" accretion rate of cold gas in galaxies of at least 0.2 Msol/yr. In order to reach the accretion rates needed to sustain the observed star formation (~1 Msol/yr), additional infall of large amounts of gas from the IGM seems to be required.

    High-Density Microwell Chip for Culture and Analysis of Stem Cells

    With recent findings on the role of reprogramming factors in stem cells, in vitro screening assays for studying (de-)differentiation are of great interest. We developed a miniaturized stem cell screening chip that is easily accessible and provides a means of rapidly studying thousands of individual stem/progenitor cell samples using low reagent volumes. For example, screening 700,000 substances would take less than two days using this platform combined with a conventional bio-imaging system (see the sketch below). The microwell chip has a standard slide format and consists of 672 wells in total. Each well holds 500 nl, a volume small enough to drastically decrease reagent costs but large enough to allow utilization of standard laboratory equipment. Results presented here include week-long culturing and differentiation assays of mouse embryonic stem cells, mouse adult neural stem cells, and human embryonic stem cells. The possibility to either maintain the cells as stem/progenitor cells or to study cell differentiation of stem/progenitor cells over time is demonstrated. Clonality is critical for stem cell research, and was accomplished in the microwell chips by isolation and clonal analysis of single mouse embryonic stem cells using flow cytometric cell sorting. Protocols for practical handling of the microwell chips are presented, describing a rapid and user-friendly method for the simultaneous study of thousands of stem cell cultures in small microwells. This microwell chip has high potential for a wide range of applications, for example directed differentiation assays and screening of reprogramming factors, opening up considerable opportunities in the stem cell field.
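
    The throughput claim above is easy to sanity-check with back-of-the-envelope arithmetic; the chip and well figures come from the abstract, while the one-substance-per-well assumption is mine:

```python
# Chips and reagent volume needed to screen 700,000 substances,
# assuming one substance per 500 nl well on a 672-well chip.
WELLS_PER_CHIP = 672
WELL_VOLUME_NL = 500
substances = 700_000

chips_needed = -(-substances // WELLS_PER_CHIP)            # ceiling division
total_reagent_ml = substances * WELL_VOLUME_NL / 1e6       # nl -> ml
print(f"chips needed: {chips_needed}")                     # 1042 chips
print(f"total reagent volume: {total_reagent_ml:.0f} ml")  # 350 ml
```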

    Secure and scalable deduplication of horizontally partitioned health data for privacy-preserving distributed statistical computation

    Background: Techniques have been developed to compute statistics on distributed datasets without revealing private information except the statistical results. However, duplicate records in a distributed dataset may lead to incorrect statistical results. Therefore, to increase the accuracy of the statistical analysis of a distributed dataset, secure deduplication is an important preprocessing step. Methods: We designed a secure protocol for the deduplication of horizontally partitioned datasets with deterministic record linkage algorithms. We provided a formal security analysis of the protocol in the presence of semi-honest adversaries. The protocol was implemented and deployed across three microbiology laboratories located in Norway, and we ran experiments on datasets in which the number of records for each laboratory varied. Experiments were also performed on simulated microbiology datasets and data custodians connected through a local area network. Results: The security analysis demonstrated that the protocol protects the privacy of individuals and data custodians under a semi-honest adversarial model. More precisely, the protocol remains secure with the collusion of up to N − 2 corrupt data custodians. The total runtime of the protocol scales linearly with the addition of data custodians and records. One million simulated records distributed across 20 data custodians were deduplicated within 45 s. The experimental results showed that the protocol is more efficient and scalable than previous protocols for the same problem. Conclusions: The proposed deduplication protocol is efficient and scalable for practical use while protecting the privacy of patients and data custodians.
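
    The protocol itself relies on secure multi-party computation, which is beyond a short sketch; the toy version below shows only the underlying idea of deterministic record linkage across horizontally partitioned sites, with each custodian contributing keyed hashes of its linkage fields so duplicates can be counted without exchanging raw identifiers. The shared key, field names, and records are all hypothetical.

```python
# Toy deduplication via keyed hashing (HMAC-SHA256) of linkage fields.
# Not the paper's protocol: no collusion resistance is claimed here.
import hashlib
import hmac

SHARED_KEY = b"pre-agreed secret"   # hypothetical key known to all custodians

def pseudonymize(record: dict) -> str:
    """Keyed hash of the deterministic linkage fields of one record."""
    linkage = "|".join(record[f] for f in ("national_id", "birth_date"))
    return hmac.new(SHARED_KEY, linkage.encode(), hashlib.sha256).hexdigest()

# Horizontally partitioned data: same schema, different patients per site.
site_a = [{"national_id": "123", "birth_date": "1970-01-01"}]
site_b = [{"national_id": "123", "birth_date": "1970-01-01"},
          {"national_id": "456", "birth_date": "1982-05-17"}]

hashes = [pseudonymize(r) for site in (site_a, site_b) for r in site]
print(f"records: {len(hashes)}, unique: {len(set(hashes))}")  # 3 records, 2 unique
```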